# Model distillation

## DMM
MCG-NJU · Apache-2.0 · Image Generation · 115 downloads · 15 likes

DMM is a score distillation-based model fusion paradigm that compresses multiple pre-trained models from different domains into a single versatile text-to-image generation model.
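The card above names the idea but not the mechanics. A toy sketch of the general score-distillation pattern it builds on, assuming frozen domain-specific teachers whose noise predictions supervise one student: the networks, dimensions, and per-sample teacher routing below are invented stand-ins, not the DMM architecture.

```python
import torch
import torch.nn as nn

class ToyScoreNet(nn.Module):
    """Stand-in for a diffusion UNet: predicts noise from (x_t, t)."""
    def __init__(self, dim=16):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim + 1, 64), nn.SiLU(), nn.Linear(64, dim))

    def forward(self, x_t, t):
        return self.net(torch.cat([x_t, t[:, None].float()], dim=-1))

dim, n_teachers = 16, 3
teachers = [ToyScoreNet(dim).eval() for _ in range(n_teachers)]  # frozen, pre-trained in reality
student = ToyScoreNet(dim)
opt = torch.optim.Adam(student.parameters(), lr=1e-3)

for step in range(100):
    x_t = torch.randn(8, dim)                    # noised samples
    t = torch.randint(0, 1000, (8,))             # diffusion timesteps
    domain = torch.randint(0, n_teachers, (8,))  # which teacher "owns" each sample
    with torch.no_grad():
        targets = torch.stack([teachers[d](x_t[i:i + 1], t[i:i + 1]).squeeze(0)
                               for i, d in enumerate(domain.tolist())])
    # Score-matching distillation loss: the student mimics each teacher's
    # noise prediction on samples from that teacher's domain.
    loss = nn.functional.mse_loss(student(x_t, t), targets)
    opt.zero_grad()
    loss.backward()
    opt.step()
```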
## Deepseek R1 Distill Qwen 32B Lora R32
Naozumi0512 · Large Language Model · Transformers · 109 downloads · 2 likes

A LoRA adapter extracted from DeepSeek-R1-Distill-Qwen-32B, built against the Qwen2.5-32B base model and suitable for parameter-efficient fine-tuning.
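Applying an adapter like this follows the standard Hugging Face PEFT pattern: load the base model, then attach the LoRA weights on top. A minimal sketch; the adapter repo id below is an assumption based on the listing name, so check the actual model page for the correct path.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "Qwen/Qwen2.5-32B"
adapter_id = "Naozumi0512/DeepSeek-R1-Distill-Qwen-32B-lora-r32"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto", device_map="auto")
model = PeftModel.from_pretrained(base, adapter_id)  # injects the rank-32 LoRA weights

# Optionally fold the adapter into the base weights for plain inference:
model = model.merge_and_unload()
```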
## Distilbart Mnli 12 9
valhalla · Text Classification · 8,343 downloads · 12 likes

DistilBart-MNLI is a lightweight version of bart-large-mnli distilled with a teacher-free ("no teacher") distillation technique, maintaining high accuracy while reducing model complexity.
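The teacher-free recipe these cards refer to copies a subset of bart-large-mnli's decoder layers into a smaller student, which is then fine-tuned on MNLI. A minimal sketch of the layer-copy step; the evenly spaced indices follow the mapping used in the transformers distillation examples (12 → 3 decoder layers keeps 0, 6, 11) and should be treated as an assumption.

```python
from torch import nn
from transformers import AutoModelForSequenceClassification

# Start from the full teacher checkpoint.
student = AutoModelForSequenceClassification.from_pretrained("facebook/bart-large-mnli")

keep = [0, 6, 11]  # 12-3 variant: all 12 encoder layers, 3 decoder layers (assumed mapping)
student.model.decoder.layers = nn.ModuleList(
    [student.model.decoder.layers[i] for i in keep]
)
student.config.decoder_layers = len(keep)
# `student` now matches the 12-3 architecture and is ready to fine-tune on MNLI.
```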
## Distilbart Mnli 12 6
valhalla · Text Classification · 49.63k downloads · 11 likes

DistilBart-MNLI is a distilled version of bart-large-mnli using the same teacher-free technique, significantly reducing model size while maintaining high performance.
## Distilbart Mnli 12 3
valhalla · Text Classification · 8,791 downloads · 19 likes

DistilBart-MNLI is a distilled version of bart-large-mnli using the same teacher-free technique, achieving performance close to the original model while being more lightweight.
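All three checkpoints are drop-in zero-shot classifiers through the transformers pipeline; swap the model id for the 12-9, 12-6, or 12-3 variant to trade accuracy against size.

```python
from transformers import pipeline

classifier = pipeline("zero-shot-classification", model="valhalla/distilbart-mnli-12-3")
result = classifier(
    "The new GPU doubles training throughput at the same power draw.",
    candidate_labels=["hardware", "politics", "sports"],
)
print(result["labels"][0], result["scores"][0])  # highest-scoring label first
```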